Multimodal integration patterns in children
Authors
Abstract
Multimodal interfaces are designed with a focus on flexibility, although few current multimodal systems are capable of adapting to major sources of user or environmental variation. Developing adaptive multimodal processing techniques will require empirical guidance on modeling key aspects of individual differences. In the present study, we collected data from 24 7-to-10-year-old children as they interacted using speech and pen input with an educational software prototype. A comprehensive analysis of children’s multimodal integration patterns revealed that they were classifiable as either simultaneous or sequential integrators, although they integrated signals simultaneously more often than adults. During their sequential constructions, their intermodal lags also were briefer than those of adult users. The high degree of consistency and early predictability of children’s integration patterns were similar to previously reported adult data. These results have implications for the development of temporal thresholds and adaptive multimodal processing strategies for children’s applications. The long-term goal of this research is life-span modeling of users’ integration and synchronization patterns, which will be needed to design future high-performance adaptive multimodal systems.
Similar resources
Integration patterns during multimodal interaction
The development of multimodal interfaces and algorithms for multimodal integration requires knowledge of integration patterns that represent how people use multiple modalities. We analyzed multimodal interaction with three different applications. Semantic analysis revealed that multimodal inputs can exhibit types of cooperation other than complementarity and redundancy. Analysis of the relationship betwee...
Toward Adaptive Information Fusion in Multimodal Systems
Techniques for information fusion are at the heart of multimodal system design. To develop new user-adaptive approaches for multimodal fusion, our lab has investigated the stability and basis of major individual differences that have been documented in users’ multimodal integration patterns. In this talk, I summarized the following: (1) there are large individual differences in users’ dominant ...
Combining User Modeling and Machine Learning to Predict Users' Multimodal Integration Patterns
Temporal as well as semantic constraints on fusion are at the heart of multimodal system processing. The goal of the present work is to develop user-adaptive temporal thresholds with improved performance characteristics over state-of-the-art fixed ones, which can be accomplished by leveraging both empirical user modeling and machine learning techniques to handle the large individual differences ...
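The contrast between a fixed and a user-adaptive temporal threshold can be illustrated with a simple statistical heuristic: wait long enough to cover a user's typical intermodal lag plus a safety margin, falling back to a fixed default when too little history exists. This is a hypothetical sketch for illustration only; the work above uses empirical user modeling and machine learning rather than this heuristic, and the default and margin values here are assumptions.

```python
import statistics

def adaptive_threshold(observed_lags: list[float],
                       default: float = 4.0,
                       margin: float = 1.5) -> float:
    """Illustrative user-adaptive fusion timeout (seconds): mean observed
    intermodal lag plus `margin` population standard deviations. With
    fewer than 3 observations, fall back to the fixed default."""
    if len(observed_lags) < 3:
        return default
    return statistics.mean(observed_lags) + margin * statistics.pstdev(observed_lags)
```

A sequential integrator with consistently short lags would thus get a much shorter wait than a system-wide fixed timeout, reducing response latency without prematurely cutting off slower users.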
Combining Semantic And Temporal Constraints For Multimodal Integration In Conversation Systems
In a multimodal conversation, user referring patterns could be complex, involving multiple referring expressions from speech utterances and multiple gestures. To resolve those references, multimodal integration based on semantic constraints is insufficient. In this paper, we describe a graph-based probabilistic approach that simultaneously combines both semantic and temporal constraints to achi...
Integrating Semantics into Multimodal Interaction Patterns
We conducted a user experiment on multimodal interaction (speech, hand position, and hand shapes) to study two major relationships: between the level of cognitive load experienced by users and the resulting multimodal interaction patterns; and how the semantics of the information being conveyed affected those patterns. We found that as cognitive load increases, users’ multimodal productions tend to become se...
Journal:
Volume  Issue
Pages -
Publication date: 2002